Entropy Estimate for Maps on Forests

Author

  • M. Sabbaghan
Abstract:

A 1993 result of J. Llibre and M. Misiurewicz (Theorem A of [5]) states that if a continuous map f of a graph into itself has an s-horseshoe, then the topological entropy of f is greater than or equal to log s, that is, h(f) ≥ log s. A 1980 result of L. S. Block, J. Guckenheimer, M. Misiurewicz and L.-S. Young (Lemma 1.5 of [3]) states that if G is an A-graph of f, then h(G) ≤ h(f). In this paper we generalize Theorem A and Lemma 1.5 to continuous functions on forests. Let F be a forest and f : F → F a continuous function. Using the adjacency matrix of a graph, we give a lower bound for the topological entropy of f.
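The kind of bound described in the abstract can be illustrated numerically. The sketch below is not the paper's own construction; it only assumes the standard facts behind the cited results: an A-graph G with 0-1 transition (adjacency) matrix A has h(G) = log λ(A), where λ(A) is the spectral radius of A, so Lemma 1.5 yields h(f) ≥ log λ(A), and for an s-horseshoe A is the s × s all-ones matrix, recovering h(f) ≥ log s. The helper name entropy_lower_bound is hypothetical.

import numpy as np

def entropy_lower_bound(adjacency):
    """Log of the spectral radius of a nonnegative adjacency matrix,
    a lower bound for the topological entropy of the underlying map
    (assuming h(f) >= h(G) = log of the spectral radius of A)."""
    A = np.asarray(adjacency, dtype=float)
    spectral_radius = max(abs(np.linalg.eigvals(A)))
    # A zero spectral radius gives only the trivial bound 0.
    return float(np.log(spectral_radius)) if spectral_radius > 0 else 0.0

# Example: the transition matrix of a 3-horseshoe is the 3 x 3 all-ones
# matrix, so the bound is log 3.
print(entropy_lower_bound(np.ones((3, 3))))  # approximately 1.0986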


Similar articles


Entropy Estimate For High Dimensional Monotonic Functions

We establish upper and lower bounds for the metric entropy and bracketing entropy of the class of d-dimensional bounded monotonic functions under Lp norms. It is interesting to see that both the metric entropy and bracketing entropy have different behaviors for p < d/(d − 1) and p > d/(d − 1). We apply the new bounds for bracketing entropy to establish a global rate of convergence of the MLE of ...


Do Hebbian synapses estimate entropy?

Hebbian learning is one of the mainstays of biologically inspired neural processing. Hebb's rule is biologically plausible, and it has been extensively utilized both in computational neuroscience and in unsupervised training of neural systems. In these fields, Hebbian learning became synonymous with correlation learning. But it is known that correlation is a second order statistic of the data, s...


Pre-image Entropy for Maps on Noncompact Topological Spaces

We propose a new definition of pre-image entropy for continuous maps on noncompact topological spaces, investigate fundamental properties of the new pre-image entropy, and compare the new pre-image entropy with the existing ones. The defined pre-image entropy generalizes that of Cheng and Newhouse. Yet it retains various basic properties of Cheng and Newhouse's pre-image entropy, for example, the ...




Journal

Volume 21, Issue 1

Pages: -

Publication date: 2010-03-01


Keywords
